# Multilingual Optimization

**Suzume Llama 3 8B Multilingual Orpo Borda Half** (lightblue)
A multilingual large language model fine-tuned from Llama-3-8B with the ORPO method, trained on the 50% of the ranking data with the most consistent rankings; it performs well across a range of language tasks.
Tags: Large Language Model, Transformers · Downloads: 4,625 · Likes: 16
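The card above mentions fine-tuning with ORPO (Odds Ratio Preference Optimization). As a minimal sketch of that objective: ORPO adds an odds-ratio preference term to the ordinary negative log-likelihood of the preferred response. The function name, the `lam` weight, and the use of per-sequence log-probabilities here are illustrative assumptions, not this model's actual training code.

```python
import math

def orpo_loss(logp_chosen: float, logp_rejected: float, lam: float = 0.1) -> float:
    """Toy ORPO objective on log-probabilities of a chosen vs. rejected response.

    odds(y|x) = P(y|x) / (1 - P(y|x))
    L_OR      = -log sigmoid(log odds(chosen) - log odds(rejected))
    total     = NLL(chosen) + lam * L_OR
    (Illustrative sketch only; real training averages token-level log-probs.)
    """
    def log_odds(logp: float) -> float:
        p = math.exp(logp)          # logp < 0, so 0 < p < 1
        return logp - math.log(1.0 - p)

    ratio = log_odds(logp_chosen) - log_odds(logp_rejected)
    l_or = -math.log(1.0 / (1.0 + math.exp(-ratio)))  # -log sigmoid(ratio)
    return -logp_chosen + lam * l_or
```

When the model assigns higher probability to the chosen response, both terms shrink, so the loss simultaneously teaches the preferred output and penalizes the rejected one without a separate reference model.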
**LitLat BERT** (EMBEDDIA)
LitLat BERT is a trilingual model based on the xlm-roberta-base architecture, optimized for Lithuanian, Latvian, and English.
Tags: Large Language Model, Transformers, Supports Multiple Languages · Downloads: 937 · Likes: 5